Try to get through a conversation about media or search without someone mentioning generative AI or some form of AI-infused technology – we dare you.
On Wednesday, Google made six AI-related announcements in one day, including the ability to use images in Bard prompts, additions to its Search Generative Experience (SGE), a virtual clothing try-on feature and two new AI-powered video campaigns for YouTube.
Other than the YouTube campaigns, none of these offerings are directly related to advertising, but there is a common theme: AI underpins it all.
And if you can bet on anything, it’s that ads will eventually make their way more deeply into most of these new experiences.
AI in view
Google has been avidly investing in Performance Max, which uses machine learning to optimize bids and placements automatically based on whatever is most likely to perform, whether that be YouTube, Maps, Gmail, Discover or the Google Display Network.
In late May, Google announced a bunch of additions to Performance Max and Google Ads, its ad-buying platform, including generative AI features to develop and test creative assets.
Advertisers will soon be able to run video view campaigns across multiple YouTube inventory types at the same time, including in-stream, in-feed and YouTube Shorts, based on whatever Google AI considers to be the optimal placement.
According to Google’s early tests, video view campaigns generate 40% more views on average than in-stream skippable cost-per-view campaigns … and no kidding. When people have the option to skip ads, they skip ads.
Google didn’t share a specific timeline for these campaigns, other than to say it will enter a global beta “soon.”
But a second campaign type, “demand gen,” is set to launch globally in beta in August.
Demand gen will allow advertisers to port their best-performing videos and images from across YouTube, Shorts, Discover and Gmail into Google Ads and use them to run campaigns.
For both demand gen and video view campaigns, advertisers will be able to optimize their bidding for conversions and clicks across specific audiences based on the campaign’s goals.
Charting a course
YouTube is already a mature advertising channel, but other newer and more experimental aspects of Google’s search and shopping experience are potentially ripe inventory sources.
Take Google Maps.
Google is busily adding features to Maps, including immersive bike lane views, 3D street views, simulated traffic views and the automatic creation of an ideal driving route across multiple cities (rather than just mapping a course between multiple destinations).
Although it is already possible to promote store locations and run ads in Maps and on business profiles through Performance Max, it’s not hard to imagine other surface areas for ads, like promos for hotels, restaurants and coffee shops near specific landmarks.
The “AI” in retail
Shopping is the most obvious opportunity for brands.
For example, Google is rolling out a new virtual try-on feature that uses generative AI to show how clothing looks on models with different hair types, body types, skin tones, ethnicities and sizes.
The models are real, but the garment they’re wearing (shirts to start, pants coming soon) is generated into the image. Everlane, Anthropologie, LOFT and H&M are among the first brands to test the feature.
Beyond being more inclusive, better product visualization should also boost online shopping intent by giving people a clearer sense of what they’re buying.
Customers can also expand and tweak their searches through a machine learning-based process called “guided similarity,” which surfaces similar items when shoppers select inputs from dropdown menus, such as pattern, color or price.
Visual search
Speaking of visualization, Google is also experimenting with Lens, its image-recognition technology.
Rather than searching with text alone, people can now search using a combination of images and words.
For example, if someone has an image of an orange dress they like, but they want to find it in a different color, they can search using the image they have, then refine their search by typing in “green,” “blue,” “red” or whatever color they’re looking for to find other versions of similar and related items for sale across multiple retailers.
Or, say you have a towel with a cool pattern and you want to find a matching rug. By uploading the towel image and adding the word “rug” in the Lens search bar, Google’s system will pull up rugs that feature the pattern or something similar.
You can even search for skin conditions – say, a mole or a rash you’re curious about but are having difficulty describing in words. Apparently, there are 10 billion (with a “b”) searches for hair-, nail- and skin-related conditions on Google every year.
Which raises the question: Could a skin cream or lotion brand advertise alongside skin-condition-related search results?
And the answer is yes. Shopping ads, which are part of Performance Max, are eligible to show up against relevant product searches on Google Lens.
(Oh, and one other announcement: If you’re feeling chatty, Google is integrating Lens with its AI chatbot, Bard, so people can converse with the bot through a combination of both text prompts and images.)
In the Search Lab
But the most experimental addition to Google’s generative AI toolkit is the Search Generative Experience (SGE), which is one of the first projects to come out of Search Labs.
SGE is an AI-powered version of Google’s search engine that automatically creates a curated summary of search results by pulling in information from multiple sources, including from around the web, user reviews, photos and business profiles.
Users can sign up to request access to SGE through Search Labs, a program that allows them to play around with these new features and share feedback with Google.
Although Google is still in the process of adding new capabilities to SGE, the gist is that people can see AI-generated results in response to informational queries, like, for example, “What are the best vegetarian restaurants in Boston?”
SGE can also handle product-related searches, like “least messy cat litter,” surfacing a brief snapshot of each litter’s particular qualities, including where it’s sold online. (But there are no sponsorship opportunities here as of yet.)
Google is treading carefully when it comes to monetizing SGE, however, because it’s still rather early. But there’s no doubt that, just like with “classic” search, advertising will be very present in Google’s new generative search experience.
In May, Google said it plans to show regular search ads dotted throughout SGE, just like on a traditional search results page.
Google also recently said it will experiment with search and shopping ads, as well as sponsored product recommendations, integrated directly into AI-generated search snapshots.
But when will those formats become generally available in the Search Generative Experience? It’s too soon to say. But maybe Bard knows.